
fix(opencode): support responses wire mode for openai-compatible gpt-5#16255

Closed
andyWang1688 wants to merge 2 commits into anomalyco:dev from andyWang1688:fix/openai-compatible-gpt5-summary

Conversation

@andyWang1688
Contributor

andyWang1688 commented Mar 6, 2026

Issue for this PR

Closes #16154

Type of change

  • Bug fix
  • New feature
  • Refactor / code improvement
  • Documentation

What does this PR do?

  • Fixes gpt-5 + @ai-sdk/openai-compatible default behavior that sent incompatible reasoningSummary on chat-style requests.
  • Adds an optional wireApi: "responses" (or wire_api) setting for openai-compatible providers.
  • In responses mode, routes provider options under openai and uses responses model loading so reasoning options map correctly.
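The routing behavior described above can be sketched as follows. This is a self-contained illustration, not the actual opencode source: the function name, option shapes, and field names are assumptions made for clarity.

```typescript
// Hypothetical sketch of the option-routing logic described in this PR.
type WireApi = "chat" | "responses"

interface ModelOptions {
  wireApi?: WireApi
  reasoningSummary?: string
  [key: string]: unknown
}

// Decide which provider options go on the wire, and under which key.
function routeProviderOptions(opts: ModelOptions): Record<string, unknown> {
  const { wireApi = "chat", ...rest } = opts
  if (wireApi === "responses") {
    // Responses mode: nest everything (including reasoningSummary)
    // under the "openai" key so responses model loading maps it.
    return { openai: rest }
  }
  // Default chat mode: drop reasoningSummary, which chat-style
  // endpoints reject, and pass the remaining options through.
  const { reasoningSummary, ...chatSafe } = rest
  return chatSafe
}
```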

How did you verify your code works?

  • Added/updated tests in packages/opencode/test/provider/transform.test.ts for:
    • default openai-compatible mode omitting reasoningSummary
    • responses wire mode preserving reasoningSummary
    • provider options routing to openai key in responses mode
  • CI validates typecheck/unit/e2e matrix on the PR.

Screenshots / recordings

N/A (non-UI change)

Checklist

  • I have tested my changes locally
  • I have not included unrelated changes in this PR

@github-actions github-actions bot added needs:compliance This means the issue will auto-close after 2 hours. and removed needs:compliance This means the issue will auto-close after 2 hours. labels Mar 6, 2026
@github-actions
Contributor

github-actions bot commented Mar 6, 2026

Thanks for updating your PR! It now meets our contributing guidelines. 👍

@github-actions
Contributor

github-actions bot commented Mar 6, 2026

The following comment was made by an LLM; it may be inaccurate:

Based on my search, I found one potentially related PR:

Related PR:

The current PR (#16255) appears to be a refinement/alternative solution to the issue previously tackled in #14783, adding support for an optional wireApi: "responses" mode for openai-compatible providers to properly handle reasoning options.

@andyWang1688
Contributor Author

andyWang1688 commented Mar 6, 2026

Thanks! I know this overlaps with #14783 on the core bug fix (avoiding reasoningSummary on openai-compatible chat requests).

This PR keeps that same default fix, and additionally adds an optional responses path via provider.<name>.options.wireApi = "responses" (wire_api is also supported) for providers that support /v1/responses.

If maintainers prefer the smallest possible change, I can split this and keep only the minimal bug fix in this PR (or open a follow-up for the responses-wire part). Happy to adjust to whichever merge strategy you prefer.
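For reference, enabling the responses path described above might look like the following opencode.json fragment. This is a hypothetical sketch: the provider name is illustrative, and the exact config-block shape is assumed rather than taken from the docs.

```json
{
  "provider": {
    "my-compat-provider": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "wireApi": "responses"
      }
    }
  }
}
```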

@rekram1-node
Collaborator

left a comment on the issue, also if ur on discord feel free to @ me in dev channel and we can talk faster

@andyWang1688
Contributor Author

Thanks for the guidance. I resolved this on my side by switching responses-based custom models to model-level provider.npm: "@ai-sdk/openai" (instead of using @ai-sdk/openai-compatible for those models). I’m closing this PR to avoid overlap/noise with the ongoing generic fixes.

One request: it would really help to clarify this in docs (when to use @ai-sdk/openai vs @ai-sdk/openai-compatible, and that model-level npm override is supported).
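The model-level override described above might look like the following opencode.json fragment. This is a hypothetical sketch, assuming the config supports a per-model npm override nested under the provider block; the provider and model names are illustrative.

```json
{
  "provider": {
    "my-compat-provider": {
      "npm": "@ai-sdk/openai-compatible",
      "models": {
        "gpt-5": {
          "npm": "@ai-sdk/openai"
        }
      }
    }
  }
}
```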

@andyWang1688
Contributor Author

Closing this since the issue is resolved for me via config: use model-level provider.npm = "@ai-sdk/openai" for responses-based custom models.

Docs clarification request remains: please document when to use @ai-sdk/openai vs @ai-sdk/openai-compatible, and explicitly mention that model-level npm override is supported.

@andyWang1688
Contributor Author

Closing note (formatted): this is resolved for me via config by using model-level provider.npm = "@ai-sdk/openai" for responses-based custom models.

Docs ask: please clarify when to use @ai-sdk/openai vs @ai-sdk/openai-compatible, and explicitly mention that model-level npm override is supported.



Development

Successfully merging this pull request may close these issues.

Bug: gpt-5 defaults + @ai-sdk/openai-compatible path sends incompatible reasoning params
